A New Look at the Easy-Hard-Easy Pattern of Combinatorial Search Difficulty
The easy-hard-easy pattern in the difficulty of combinatorial search problems
as constraints are added has been explained as due to a competition between the
decrease in number of solutions and increased pruning. We test the generality
of this explanation by examining one of its predictions: if the number of
solutions is held fixed by the choice of problems, then increased pruning
should lead to a monotonic decrease in search cost. Instead, for some search
methods we find the easy-hard-easy pattern in median search cost even when the
number of solutions is held constant. This generalizes previous
observations of this pattern and shows that the existing theory does not
explain the full range of the peak in search cost. In these cases the pattern
appears to be due to changes in the size of the minimal unsolvable subproblems,
rather than changing numbers of solutions.
Comment: See http://www.jair.org/ for any accompanying file
PAC-Bayesian Bounds for Randomized Empirical Risk Minimizers
The aim of this paper is to generalize the PAC-Bayesian theorems proved by
Catoni in the classification setting to more general problems of statistical
inference. We show how to control the deviations of the risk of randomized
estimators. Particular attention is paid to randomized estimators drawn in a
small neighborhood of classical estimators, whose study makes it possible to
control the risk of the latter. These results allow us to bound the risk of
very general estimation procedures, as well as to perform model selection.
Hiding solutions in random satisfiability problems: A statistical mechanics approach
A major problem in evaluating stochastic local search algorithms for
NP-complete problems is the need for a systematic generation of hard test
instances having previously known properties of the optimal solutions. On the
basis of statistical mechanics results, we propose random generators of hard
and satisfiable instances for the 3-satisfiability problem (3SAT). The design
of the hardest problem instances is based on the existence of a first order
ferromagnetic phase transition and the glassy nature of excited states. The
analytical predictions are corroborated by numerical results obtained from
complete as well as stochastic local algorithms.
Comment: 5 pages, 4 figures, revised version to app. in PR
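The idea of a generator whose output is satisfiable by construction can be illustrated with the simplest planted-solution scheme: fix a hidden assignment and keep only clauses it satisfies. This is a minimal sketch with hypothetical names, not the paper's statistical-mechanics construction; naively planted instances of this kind are known to be comparatively easy, which is precisely the shortcoming the paper addresses.

```python
import random

def planted_3sat(n_vars, n_clauses, seed=0):
    """Generate a satisfiable 3-SAT instance by planting a solution:
    draw random 3-literal clauses and keep only those the planted
    assignment satisfies.  (Illustrative scheme, not the paper's.)"""
    rng = random.Random(seed)
    assignment = [rng.choice([False, True]) for _ in range(n_vars)]
    clauses = []
    while len(clauses) < n_clauses:
        variables = rng.sample(range(n_vars), 3)  # 3 distinct variables
        clause = [(v, rng.choice([False, True])) for v in variables]
        # literal (v, negated) is true under the assignment iff
        # assignment[v] != negated; keep the clause only if some literal is true
        if any(assignment[v] != negated for v, negated in clause):
            clauses.append(clause)
    return clauses, assignment

clauses, assignment = planted_3sat(n_vars=20, n_clauses=85)
# The planted assignment satisfies every generated clause by construction.
assert all(any(assignment[v] != neg for v, neg in c) for c in clauses)
```

Because rejected clauses are simply resampled, the clause distribution is biased toward the hidden assignment, leaking information to local search; the first-order phase transition analysis in the paper is what allows hiding solutions without making the instances easy.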
Development and implementation of blood pressure screening and referral guidelines for German community pharmacists.
Involvement of community pharmacists in the detection and control of hypertension improves patient care. However, current European or North-American guidelines do not provide specific guidance on how to implement collaboration between pharmacists and physicians, especially when and how to refer patients with undetected or uncontrolled hypertension to a physician. The German Society of Cardiology and the ABDA - Federal Union of German Associations of Pharmacists developed and tested referral recommendations for community pharmacists, embedded in two guideline worksheets. The project included a guideline-directed blood pressure (BP) measurement and recommendations on when patients should be referred to their physician. A "red flag" referral within 4 weeks was recommended when SBP was >140 mm Hg or DBP >90 mm Hg (for subjects <80 years) or >160 mm Hg or >90 mm Hg (≥80 years) in undetected individuals, or >130 mm Hg or >80 mm Hg (<65 years) or >140 mm Hg or >80 mm Hg (≥65 years) in treated patients. BP was measured in 187 individuals (86 with known hypertension, mean [±SD] age 62 ± 15 years, 64% female, and 101 without known hypertension, 47 ± 16 years, 75% female) from 17 community pharmacies. In patients with known hypertension, poorly controlled BP was detected in 55% (n = 47), and these patients were referred. A total of 16/101 subjects without a history of hypertension were referred to their physician because of uncontrolled BP. Structured BP testing in pharmacies identified a significant number of subjects with undetected/undiagnosed hypertension and patients with poorly controlled BP. Community pharmacists could play a significant role, in collaboration with physicians, in improving the management of hypertension.
Hepatitis A in pediatric acute liver failure in South India
This article does not have an abstract
A Path Algorithm for Constrained Estimation
Many least squares problems involve affine equality and inequality
constraints. Although there is a variety of methods for solving such problems,
most statisticians find constrained estimation challenging. The current paper
proposes a new path following algorithm for quadratic programming based on
exact penalization. Similar penalties arise in regularization in model
selection. Classical penalty methods solve a sequence of unconstrained problems
that put greater and greater stress on meeting the constraints. In the limit as
the penalty constant tends to infinity, one recovers the constrained solution.
In the exact penalty method, squared penalties are replaced by absolute value
penalties, and the solution is recovered for a finite value of the penalty
constant. The exact path following method starts at the unconstrained solution
and follows the solution path as the penalty constant increases. In the
process, the solution path hits, slides along, and exits from the various
constraints. Path following in lasso penalized regression, in contrast, starts
with a large value of the penalty constant and works its way downward. In both
settings, inspection of the entire solution path is revealing. Just as with the
lasso and generalized lasso, it is possible to plot the effective degrees of
freedom along the solution path. For a strictly convex quadratic program, the
exact penalty algorithm can be framed entirely in terms of the sweep operator
of regression analysis. A few well chosen examples illustrate the mechanics and
potential of path following.
Comment: 26 pages, 5 figures
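The key contrast with classical squared penalties can be seen on a one-dimensional toy problem: with an absolute-value penalty, the constrained solution is reached at a finite penalty constant. The sketch below uses a plain grid search for illustration (the function name and problem are assumptions, not the paper's sweep-operator path algorithm).

```python
import numpy as np

# Toy problem: minimize (x + 2)^2 subject to x >= 0.
# Constrained optimum is x = 0.  Exact (absolute-value) penalty:
#     f_rho(x) = (x + 2)^2 + rho * max(-x, 0)

def exact_penalty_minimizer(rho):
    """Minimize the penalized objective over a fine grid (illustrative only)."""
    x = np.linspace(-3.0, 1.0, 4001)  # grid with step 0.001
    f = (x + 2.0) ** 2 + rho * np.maximum(-x, 0.0)
    return x[np.argmin(f)]

# For rho < 4 the penalized minimizer x = rho/2 - 2 still violates x >= 0;
# for any finite rho >= 4 it sits exactly at the constrained solution x = 0.
assert abs(exact_penalty_minimizer(2.0) - (-1.0)) < 1e-2
assert abs(exact_penalty_minimizer(6.0) - 0.0) < 1e-2
```

With a squared penalty rho * max(-x, 0)**2, by contrast, the minimizer only approaches 0 as rho tends to infinity, which is the motivation for exact penalization and for following the solution path in rho.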
Image Reconstruction in Multi-Channel Model Under Gaussian Noise
Image reconstruction from noisy data is studied. A nonparametric boundary function is estimated from observations in N independent channels in Gaussian white noise. In each channel the image and the background intensities are unknown. They define a non-identifiable nuisance parameter that slows down the typical minimax rate of convergence. The large-sample asymptotics of the minimax risk are derived, and an asymptotically optimal estimator of the boundary function is suggested.
Differencing techniques in semi-parametric panel data varying coefficient models with fixed effects: a Monte Carlo study.
Recently, some new techniques have been proposed for the estimation of semi-parametric fixed effects varying coefficient panel data models. These new techniques fall within the class of so-called differencing estimators. In particular, we consider first-differences and within local linear regression estimators. Analyzing their asymptotic properties, it turns out that, keeping the same order of magnitude for the bias term, these estimators exhibit different asymptotic bounds for the variance. In both cases, the consequence is suboptimal non-parametric rates of convergence. In order to solve this problem, by exploiting the additive structure of the model, a one-step backfitting algorithm is proposed. Under fairly general conditions, the resulting estimators show optimal rates of convergence and exhibit the oracle efficiency property. Since both estimators are asymptotically equivalent, it is of interest to analyze their behavior in small sample sizes. In a fully parametric context, it is well known that, under strict exogeneity assumptions, the performance of both first-differences and within estimators depends on the stochastic structure of the idiosyncratic random errors. However, in the non-parametric setting, other factors such as dimensionality or sample size are also of great interest; in particular, we are interested in the estimators' relative average mean square error under different scenarios. The simulation results basically confirm the theoretical findings for both local linear regression and one-step backfitting estimators. However, we have found that within estimators are rather sensitive to the number of time observations.
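The first-differences idea these estimators build on can be sketched in the fully parametric, noiseless case, where differencing over time removes the time-invariant individual effect exactly. The sizes and coefficient below are illustrative assumptions, not values from the paper, and this is the constant-coefficient case rather than the varying-coefficient setting studied there.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, beta = 50, 6, 2.0                 # illustrative panel dimensions and slope

alpha = rng.normal(size=(N, 1)) * 10.0  # individual fixed effects alpha_i
x = rng.normal(size=(N, T))
y = alpha + beta * x                    # noiseless panel for clarity

# First differences over time wipe out the time-invariant alpha_i:
#     y_it - y_i,t-1 = beta * (x_it - x_i,t-1)
dy = np.diff(y, axis=1).ravel()
dx = np.diff(x, axis=1).ravel()
beta_hat = (dx @ dy) / (dx @ dx)        # OLS slope on the differenced data

assert abs(beta_hat - beta) < 1e-10     # fixed effects removed exactly
```

With idiosyncratic noise added, the differenced errors become serially correlated (MA(1)), which is one reason first-differences and within estimators attain different variance bounds in the non-parametric analysis above.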